Low-Rank Matrix Recovery via Efficient Schatten p-Norm Minimization
Authors
Abstract
As an emerging machine learning and information retrieval technique, matrix completion has been successfully applied to many scientific applications, such as collaborative prediction in information retrieval and video completion in computer vision. The matrix completion problem is to recover a low-rank matrix when only a fraction of its entries are observed. Instead of solving only the widely used trace norm (nuclear norm) relaxation, we address the trace norm and the original rank formulations directly. We propose a novel Schatten p-norm optimization framework that unifies the different norm formulations. An efficient algorithm is derived to solve the new objective, together with a rigorous theoretical proof of its convergence. The main previous solution strategy for this problem requires computing singular value decompositions, a task whose cost grows quickly as the matrix size and rank increase. Our algorithm has a closed-form solution in each iteration and hence converges fast. As a consequence, it is able to solve large-scale matrix completion problems. Empirical studies on recommendation system data sets demonstrate the promising performance of our new optimization framework and efficient algorithm.

Introduction

In many machine learning applications, the measured data can be represented as a matrix M ∈ R^{n×m} for which only a relatively small number of entries are observed. The matrix completion problem is to find a matrix with low rank or low norm based on the observed entries, and it has been actively studied in statistical learning, optimization, and information retrieval (Candes and Recht 2008; Candes and Tao 2009; Cai, Candes, and Shen 2008; Rennie and Srebro 2005). Such formulations occur in many recent machine learning applications, such as recommender systems and collaborative prediction (Srebro, Rennie, and Jaakkola 2004; Rennie and Srebro 2005; Abernethy et al. 2009), multitask learning (Abernethy et al. 2006; Pong et al. 2010; Argyriou, Evgeniou, and Pontil 2008), image/video completion (Liu et al. 2009), and classification with multiple classes (Amit et al. 2007).

The matrix completion problem of recovering a low-rank matrix from a subset of its entries is

\min_{X \in \mathbb{R}^{n \times m}} \mathrm{rank}(X), \quad \text{s.t.} \quad X_{ij} = T_{ij}, \ \forall (i,j) \in \Omega, \qquad (1)

where rank(X) denotes the rank of the matrix X and T_{ij} ∈ R are the observed entries indexed by the set Ω. Directly solving problem (1) is difficult, as rank minimization is known to be NP-hard. (M. Fazel 2002) proved that the trace norm is the convex envelope of the rank function over the unit ball of matrices, and thus the trace norm is the best convex approximation of the rank function. More recently, it has been shown in (Candes and Recht 2008; Candes and Tao 2009; Recht, Fazel, and Parrilo 2010) that, under some conditions, the solution of the problem in Eq. (1) can be found by solving the following convex optimization problem:

\min_{X \in \mathbb{R}^{n \times m}} \|X\|_*, \quad \text{s.t.} \quad X_{ij} = T_{ij}, \ \forall (i,j) \in \Omega, \qquad (2)

where ‖X‖_* is the trace norm of X. Several methods (Toh and Yun 2009; Ji and Ye 2009; Liu, Sun, and Toh 2009; Ma, Goldfarb, and Chen 2009; Mazumder, Hastie, and Tibshirani 2009) have recently been published to solve this kind of trace norm minimization problem. In this paper, we propose a new optimization framework for discovering low-rank matrices with the Schatten p-norm, which can be used to solve the problems in both Eq. (1) and Eq. (2).
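To make the SVD-based solution strategy mentioned above concrete, the following sketch shows a soft-impute / singular value thresholding iteration for a penalized form of the trace norm problem in Eq. (2). This is not the algorithm proposed in this paper; the function name, the threshold tau, and the iteration count are illustrative assumptions.

```python
# A minimal sketch (not the algorithm proposed in this paper) of the SVD-based
# strategy discussed above: a soft-impute / singular value thresholding iteration
# for a penalized form of the trace norm problem in Eq. (2). The function name,
# the threshold tau, and the iteration count are illustrative assumptions.
import numpy as np

def svt_complete(T, mask, tau=1.0, n_iters=200):
    """Approximate trace norm matrix completion by iterative SVD soft-thresholding.

    T    : (n, m) array holding the observed values (arbitrary elsewhere)
    mask : (n, m) boolean array, True where T_ij is observed
    """
    X = np.zeros_like(T, dtype=float)
    for _ in range(n_iters):
        # Keep the observed entries, fill the rest with the current estimate.
        Y = np.where(mask, T, X)
        # Proximal step for the trace norm: soft-threshold the singular values.
        # This full SVD is the per-iteration cost that grows with size and rank.
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = (U * np.maximum(s - tau, 0.0)) @ Vt
    return X

# Usage on a synthetic rank-2 matrix with roughly half of its entries observed.
rng = np.random.default_rng(0)
M = rng.standard_normal((60, 2)) @ rng.standard_normal((2, 40))
mask = rng.random(M.shape) < 0.5
X_hat = svt_complete(M, mask, tau=0.5, n_iters=500)
print("relative error:", np.linalg.norm(X_hat - M) / np.linalg.norm(M))
```

The singular value soft-thresholding step is the proximal operator of the trace norm, which is why methods of this family need a full SVD in every iteration; avoiding that per-iteration cost is the motivation for the closed-form updates claimed above.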
When p = 1, we obtain the trace norm formulation of Eq. (2); when p → 0, the objective approaches that of Eq. (1). We introduce an efficient algorithm to solve the Schatten p-norm minimization problem with guaranteed convergence. We rigorously prove that the algorithm monotonically decreases the objective for 0 < p ≤ 2, which covers the range we are interested in. Empirical studies demonstrate the promising performance of our optimization framework.

Recover Low-Rank Matrix with Schatten p-Norm

The Schatten p-Norm

Definitions on Matrices. In this paper, all matrices are written as boldface uppercase letters and all vectors as boldface lowercase letters. For a matrix M, the i-th column, the i-th row, and the ij-th entry of M are denoted by m_i, m^i, and m_{ij}, respectively.
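As a quick numerical illustration of the p = 1 and p → 0 behavior described above, the sketch below evaluates ‖X‖_{S_p}^p = Σ_i σ_i(X)^p directly from the singular values of a synthetic rank-2 matrix. It assumes the standard definition of the Schatten p-norm; the helper name and test matrix are illustrative, and this is only a check of the interpolation property, not the SVD-free algorithm developed in this paper.

```python
# A minimal sketch of the interpolation property claimed above, assuming the
# standard definition ||X||_{S_p}^p = sum_i sigma_i(X)^p over the singular
# values; the helper name and the test matrix are illustrative.
import numpy as np

def schatten_p_power(X, p, tol=1e-10):
    """Return ||X||_{S_p}^p, the sum of the p-th powers of the singular values."""
    s = np.linalg.svd(X, compute_uv=False)
    s = s[s > tol * s.max()]          # drop numerically-zero singular values
    return float(np.sum(s ** p))

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 20))  # rank-2 matrix

print(schatten_p_power(X, 1.0))    # p = 1: the trace (nuclear) norm, sum of singular values
print(schatten_p_power(X, 0.01))   # small p: each nonzero singular value contributes
                                   # roughly 1, so the value approaches rank(X) = 2
```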
Similar resources
Performance Guarantees for Schatten-$p$ Quasi-Norm Minimization in Recovery of Low-Rank Matrices
We address some theoretical guarantees for Schatten-p quasi-norm minimization (p ∈ (0, 1]) in recovering low-rank matrices from compressed linear measurements. Firstly, using null space properties of the measuring operator, we provide a sufficient condition for exact recovery of low-rank matrices. This condition guarantees unique recovery of matrices of ranks equal to or larger than what is guaran...
Scalable Algorithms for Tractable Schatten Quasi-Norm Minimization
The Schatten-p quasi-norm (0 < p < 1) is usually used to replace the standard nuclear norm in order to approximate the rank function more accurately. However, existing Schatten-p quasi-norm minimization algorithms involve singular value decomposition (SVD) or eigenvalue decomposition (EVD) in each iteration, and thus may become very slow and impractical for large-scale problems. In this paper, we fi...
Stability of low-rank matrix recovery and its connections to Banach space geometry
Abstract. There are well-known relationships between compressed sensing and the geometry of the finite-dimensional lp spaces. A result of Kashin and Temlyakov [20] can be described as a characterization of the stability of the recovery of sparse vectors via l1-minimization in terms of the Gelfand widths of certain identity mappings between finite-dimensional l1 and l2 spaces, whereas a more recen...
Joint Schatten p-norm and ℓp-norm robust matrix completion for missing value recovery
The low-rank matrix completion problem is a fundamental machine learning and data mining problem with many important applications. The standard low-rank matrix completion methods relax the rank minimization problem via trace norm minimization. However, this relaxation may make the solution deviate seriously from the original solution. Meanwhile, most completion methods minimize the squared p...
Improved dynamic MRI reconstruction by exploiting sparsity and rank-deficiency.
In this paper we address the problem of dynamic MRI reconstruction from partially sampled K-space data. Our work is motivated by previous studies in this area that proposed exploiting the spatiotemporal correlation of the dynamic MRI sequence by posing the reconstruction problem as a least squares minimization regularized by sparsity and low-rank penalties. Ideally the sparsity and low-rank pen...